[michelle]
I know very little about neural networks, but I'd like to know more. My engineering side likes to learn by implementing things, so I'm proposing some kind of project where I implement a (probably very primitive) neural network. I'd focus on a particular musical problem (TBD) that would help to demonstrate some of the strengths and weaknesses of the neural network approach.
[danny]
Cool Michelle, I've done a few things with Neural Nets in the past... nothing too complex. Let me know when you have a particular problem in mind, and I'll see if I can help out at all.
[jerry]
So Michelle, any further thoughts, after looking through the Connectionism in Music book? As I mentioned to you, the Todd chapter on composing music with neural nets is one I've spent some time with; but there are other chapters one might use as jumping-off points too.
[michelle]
I read through Mark Dolson's chapter (the first one), and then I started on the Todd chapter. I felt sort of lost though - the ideas were interesting from a theoretical perspective, but I couldn't really relate to what they were saying about the challenges of accomplishing their goals with neural networks since I still didn't really understand how the networks worked.
I then took a step back and implemented a very primitive perceptron in Java - for me that made many things much clearer (in particular, how the learning/training works). The following website was a useful reference for that endeavor: http://www.iiit.net/~vikram/nn_intro.html
I trained the perceptron to compute "OR" and "AND" and "NOT" and then moved on to the first (and most basic) example that Dolson showed, which was determining if one number was at least two times as large as a second number.
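In case it helps anyone else get started, here's a stripped-down sketch of the kind of perceptron I mean - not my exact code, and the OR demo in main (along with its learning rate and epoch count) is just for illustration:

// A single threshold unit trained with the standard perceptron rule:
//   w_i += eta * (target - output) * x_i
public class Perceptron {
    private final double[] w;  // one weight per input, plus a bias in the last slot
    private final double eta;  // learning rate

    public Perceptron(int nInputs, double eta) {
        this.w = new double[nInputs + 1]; // weights start at zero
        this.eta = eta;
    }

    // Output 1 if the weighted sum (including the bias) is positive, else 0.
    public int predict(double[] x) {
        double sum = w[w.length - 1]; // bias
        for (int i = 0; i < x.length; i++) sum += w[i] * x[i];
        return sum > 0 ? 1 : 0;
    }

    // One training step: nudge each weight in proportion to the error.
    public void train(double[] x, int target) {
        int error = target - predict(x);
        for (int i = 0; i < x.length; i++) w[i] += eta * error * x[i];
        w[w.length - 1] += eta * error; // bias update
    }

    public static void main(String[] args) {
        double[][] xs = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        int[] or = {0, 1, 1, 1}; // truth table for OR
        Perceptron p = new Perceptron(2, 0.1);
        for (int epoch = 0; epoch < 20; epoch++)  // OR is linearly separable,
            for (int i = 0; i < xs.length; i++)   // so this converges quickly
                p.train(xs[i], or[i]);
        for (double[] x : xs)
            System.out.println((int) x[0] + " OR " + (int) x[1] + " = " + p.predict(x));
    }
}

AND, NOT, and Dolson's "at least two times as large" example just swap in different training pairs; they're all linearly separable, which is why a single perceptron can handle them.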
Now that I feel like I understand more clearly how neural networks function, I'm looking again at the musical applications of them and trying to relate those applications and the more complicated networks to the simple ones that I played with. Danny was kind enough to point me to a neural network library/framework in Java called Joone (Java Object Oriented Neural Engine). I'm going to look into that and see if I can use it to play with some of these more complex music-oriented networks. If Joone doesn't seem like the right thing, then Danny also mentioned a couple of other libraries (in C/C++ maybe?), so maybe he will add a comment here and list some of them for reference (hint hint).
In summary, this week's efforts were almost all computer science focused and not music focused, but I think it was time well spent. :)
[danny]
Hint taken. The other NNet library I mentioned was FANN (Fast Artificial Neural Network) in C/C++. There is also a Max/MSP implementation of FANN if you are interested (search maxobjects.com). I believe that Joone will have more options for constructing temporally meaningful networks, although FANN can implement a Time Delay Neural Net.
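For the record, the time-delay idea itself is conceptually simple: you hand an ordinary feed-forward net a sliding window over the sequence, so recent history becomes part of the input. A rough sketch in Java, since that's what Michelle is working in (the helper name is just made up):

// Turn a sequence into fixed-size windows so a plain feed-forward net
// can see a slice of history at each step (the core of the TDNN idea).
static double[][] slidingWindows(double[] sequence, int windowSize) {
    double[][] windows = new double[sequence.length - windowSize + 1][];
    for (int t = 0; t < windows.length; t++)
        windows[t] = java.util.Arrays.copyOfRange(sequence, t, t + windowSize);
    return windows;
}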
[michelle]
I started playing with Joone, but it didn't work well for me. I was really excited at first because it comes with what appeared to be a nice GUI for dragging around various network components and building a network graphically. The "help" documentation gave a little tutorial on how to build a network that would solve the XOR problem, so I did that. When I tried to extend that to something with more than 2 inputs, though, the GUI would freeze for long periods, or take a *really* long time to run through the training examples, and the results didn't seem to be converging the way they should. At that point the "help" docs were pretty much useless, since they're very out of date and don't describe about half of the mysterious parameters that can be tweaked. Very disappointing. I also tried building the actual Joone source code, but it seemed that not all of the required pieces were there. Maybe I missed something.
Anyway, I got this great book from the library called "Machine Learning" by Tom Mitchell, and it has a very clear description of the back-propagation learning algorithm and how to build a basic feed-forward network. So I went and implemented my own. :P It's now working, and I have it running a version of the rhythm-classification example from the Dolson chapter. Unfortunately, most musical things have a time dimension to them, and a feed-forward network has no sense of time/history, so I can't do any of the really cool things yet. I think I'll try to turn my feed-forward network into one with feedback connections (i.e., a recurrent network) and see what happens (maybe I'll find that it's too much of a pain or that I don't have enough references to work with, but it's worth a try).
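For the curious, the core of what I built looks roughly like the sketch below: one hidden layer of sigmoid units trained with the stochastic-gradient-descent version of back-propagation that Mitchell describes. This is a simplified version, not my exact code (the class name and the parameters in main are arbitrary), and I'm demoing it on XOR - the classic "needs a hidden layer" problem - rather than the rhythm example, since that one takes more setup:

import java.util.Random;

// One-hidden-layer feed-forward net trained with the stochastic
// gradient descent version of back-propagation from Mitchell's book.
public class BackpropNet {
    private final double[][] wHidden; // [hidden unit][input weights..., bias]
    private final double[][] wOutput; // [output unit][hidden weights..., bias]
    private final double eta;         // learning rate

    public BackpropNet(int nIn, int nHidden, int nOut, double eta, long seed) {
        this.eta = eta;
        Random rnd = new Random(seed);
        wHidden = new double[nHidden][nIn + 1];
        wOutput = new double[nOut][nHidden + 1];
        for (double[] row : wHidden)             // small random initial weights
            for (int i = 0; i < row.length; i++) // in [-0.05, 0.05]
                row[i] = rnd.nextDouble() * 0.1 - 0.05;
        for (double[] row : wOutput)
            for (int i = 0; i < row.length; i++)
                row[i] = rnd.nextDouble() * 0.1 - 0.05;
    }

    private static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // Push an input vector through one layer of sigmoid units.
    private static double[] layer(double[][] w, double[] in) {
        double[] out = new double[w.length];
        for (int j = 0; j < w.length; j++) {
            double sum = w[j][in.length]; // bias weight sits in the last slot
            for (int i = 0; i < in.length; i++) sum += w[j][i] * in[i];
            out[j] = sigmoid(sum);
        }
        return out;
    }

    public double[] feedForward(double[] in) {
        return layer(wOutput, layer(wHidden, in));
    }

    // One back-propagation step on a single (input, target) pair.
    public void train(double[] in, double[] target) {
        double[] hidden = layer(wHidden, in);
        double[] out = layer(wOutput, hidden);

        // Output error terms: delta_k = o_k (1 - o_k) (t_k - o_k)
        double[] dOut = new double[out.length];
        for (int k = 0; k < out.length; k++)
            dOut[k] = out[k] * (1 - out[k]) * (target[k] - out[k]);

        // Hidden error terms: delta_h = o_h (1 - o_h) * sum_k w_kh delta_k
        double[] dHidden = new double[hidden.length];
        for (int h = 0; h < hidden.length; h++) {
            double sum = 0;
            for (int k = 0; k < out.length; k++) sum += wOutput[k][h] * dOut[k];
            dHidden[h] = hidden[h] * (1 - hidden[h]) * sum;
        }

        // Weight updates: w += eta * delta * (input feeding that weight)
        for (int k = 0; k < out.length; k++) {
            for (int h = 0; h < hidden.length; h++)
                wOutput[k][h] += eta * dOut[k] * hidden[h];
            wOutput[k][hidden.length] += eta * dOut[k]; // bias
        }
        for (int h = 0; h < hidden.length; h++) {
            for (int i = 0; i < in.length; i++)
                wHidden[h][i] += eta * dHidden[h] * in[i];
            wHidden[h][in.length] += eta * dHidden[h]; // bias
        }
    }

    public static void main(String[] args) {
        double[][] xs = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[][] ts = {{0}, {1}, {1}, {0}};       // XOR truth table
        BackpropNet net = new BackpropNet(2, 3, 1, 0.5, 42);
        for (int epoch = 0; epoch < 20000; epoch++) // usually plenty for XOR;
            for (int i = 0; i < xs.length; i++)     // re-seed if it stalls in a
                net.train(xs[i], ts[i]);            // local minimum
        for (double[] x : xs)
            System.out.printf("%.0f XOR %.0f -> %.3f%n", x[0], x[1], net.feedForward(x)[0]);
    }
}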
One question I have: it's largely a mystery to me right now how one determines how many hidden layers, and how many "neurons" in each of those layers, a network needs to accomplish a particular task. Does anyone have a good explanation of that, or a reference to one? Or is it really mostly trial and error?